PyTorch Deep Learning Hands-On (Packt Publishing)
EncoderBlock
This is the encoder part of the network: it downsamples the input and tries to produce a compressed representation that captures the input's essence. Its basic building block is the ConvBlock we developed earlier.
Figure 4.10: Encoder figure
As seen in the preceding diagram, each encoder block in LinkNet consists of four convolution blocks. The first two convolution blocks are grouped as block one. Its output is added to a residue output (an architectural decision motivated by ResNet). That sum then goes through block two, which is structured like block one. The input of block two is added to its output directly, without passing through a separate residue block.
Block one downsamples the input by a factor of two, while block two leaves the dimensions of the input unchanged. That is why block one needs a residue connection built from a strided convolution, whereas for block two we can add the input and output directly. The code that implements the architecture follows. The __init__ function initializes the convolution blocks and the residue block. PyTorch handles the addition of tensors for us, so we just write the mathematical operation as we would on a normal Python variable, and PyTorch's autograd takes it from there.
class EncoderBlock(nn.Module):
    """Residual block in LinkNet that does encoding, mirroring the layers in ResNet18.

    ResNet18's first encoder layer does no downsampling; here, downsampling
    is handled by the stride-2 convolutions in block1 and in the residue
    connection, which halve the spatial size via (n + 2p - f)/s + 1.
    """

    def __init__(self, inp, out):
        super().__init__()
        # Block one downsamples by a factor of two (stride=2 in its first convolution)
        self.block1 = nn.Sequential(
            ConvBlock(inp=inp, out=out, kernal=3, stride=2, pad=1, bias=True, act=True),
            ConvBlock(inp=out, out=out, kernal=3, stride=1, pad=1, bias=True, act=True))
        # Block two keeps the spatial dimensions unchanged
        self.block2 = nn.Sequential(
            ConvBlock(inp=out, out=out, kernal=3, stride=1, pad=1, bias=True, act=True),
            ConvBlock(inp=out, out=out, kernal=3, stride=1, pad=1, bias=True, act=True))
        # Strided residue connection so its output matches block one's downsampled shape
        self.residue = ConvBlock(
            inp=inp, out=out, kernal=3, stride=2, pad=1, bias=True, act=True)

    def forward(self, x):
        out1 = self.block1(x)
        residue = self.residue(x)
        out2 = self.block2(out1 + residue)
        return out2 + out1
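To see the downsampling in action, here is a minimal runnable sketch. The ConvBlock below is an assumed stand-in for the one built earlier in the chapter (a convolution followed by batch normalization and an optional ReLU); its parameter names (inp, out, kernal, stride, pad, bias, act) follow the book's code, but its exact internals are a guess for illustration. The input sizes (64 channels, 56x56) are arbitrary example values.

```python
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    # Assumed stand-in: conv -> batch norm -> optional ReLU
    def __init__(self, inp, out, kernal, stride, pad, bias, act):
        super().__init__()
        layers = [nn.Conv2d(inp, out, kernal, stride, pad, bias=bias),
                  nn.BatchNorm2d(out)]
        if act:
            layers.append(nn.ReLU(inplace=True))
        self.block = nn.Sequential(*layers)

    def forward(self, x):
        return self.block(x)


class EncoderBlock(nn.Module):
    def __init__(self, inp, out):
        super().__init__()
        self.block1 = nn.Sequential(
            ConvBlock(inp=inp, out=out, kernal=3, stride=2, pad=1, bias=True, act=True),
            ConvBlock(inp=out, out=out, kernal=3, stride=1, pad=1, bias=True, act=True))
        self.block2 = nn.Sequential(
            ConvBlock(inp=out, out=out, kernal=3, stride=1, pad=1, bias=True, act=True),
            ConvBlock(inp=out, out=out, kernal=3, stride=1, pad=1, bias=True, act=True))
        self.residue = ConvBlock(
            inp=inp, out=out, kernal=3, stride=2, pad=1, bias=True, act=True)

    def forward(self, x):
        out1 = self.block1(x)
        residue = self.residue(x)
        out2 = self.block2(out1 + residue)
        return out2 + out1


x = torch.randn(1, 64, 56, 56)
enc = EncoderBlock(inp=64, out=128)
y = enc(x)
print(y.shape)  # torch.Size([1, 128, 28, 28])
```

Note how the stride-2 convolutions halve the 56x56 input to 28x28 via (56 + 2*1 - 3)/2 + 1 = 28, while both block two and the residue addition preserve that shape, so all three tensors being summed line up.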